# RoBERTa Architecture Optimization
## XLM-RoBERTa Base Intent Twin
License: MIT
XLM-RoBERTa-base is a multilingual pre-trained model built on the RoBERTa architecture. This variant supports Russian and English and is suited to text classification tasks such as intent detection.
Tags: Text Classification, Transformers, Supports Multiple Languages

Publisher: forthisdream · Downloads: 30 · Likes: 1
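A fine-tuned XLM-R classifier like this is typically loaded through the Transformers pipeline API, as in the minimal sketch below. The repo id `forthisdream/xlm-roberta-base-intent-twin` is an assumption pieced together from the publisher and model name above, not a confirmed path.

```python
from transformers import pipeline

# Assumed repo id, inferred from the publisher and model name listed above;
# replace with the actual Hugging Face repo id before running.
classifier = pipeline(
    "text-classification",
    model="forthisdream/xlm-roberta-base-intent-twin",
)

# The card lists Russian and English support, so both languages should work.
print(classifier("Turn off the lights in the kitchen"))
print(classifier("Включи музыку, пожалуйста"))
```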
## BERTovski
BERTovski is a large pre-trained language model trained on Bulgarian and Macedonian texts. It uses the RoBERTa architecture and is a product of the MaCoCu project.
Tags: Large Language Model, Other
Publisher: MaCoCu · Downloads: 28 · Likes: 1
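Since BERTovski is a RoBERTa-style masked language model, a minimal way to try it is the fill-mask pipeline, sketched below. The repo id `MaCoCu/BERTovski` is assumed from the publisher listed above.

```python
from transformers import pipeline

# Assumed repo id based on the publisher listed above.
fill_mask = pipeline("fill-mask", model="MaCoCu/BERTovski")

# RoBERTa-style models use <mask> as the mask token.
# Bulgarian: "Sofia is the capital of <mask>."
for prediction in fill_mask("София е столицата на <mask>."):
    print(prediction["token_str"], round(prediction["score"], 4))
```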
## BERTweet Large
License: MIT
BERTweet is the first large-scale language model pre-trained specifically for English tweets. It follows the RoBERTa pre-training procedure and is suited to social media text analysis.
Tags: Large Language Model, Transformers

Publisher: vinai · Downloads: 2,853 · Likes: 12
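A minimal sketch of extracting tweet embeddings with this model follows, using the `vinai/bertweet-large` repo id that matches the publisher and model name above. Note that the BERTweet authors normalize raw tweets before tokenization, replacing user mentions with @USER and URLs with HTTPURL.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-large")
model = AutoModel.from_pretrained("vinai/bertweet-large")

# Input shown already normalized in the BERTweet style
# (mentions -> @USER, URLs -> HTTPURL).
tweet = "@USER I love the new update HTTPURL"

inputs = tokenizer(tweet, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, hidden_size) and can
# feed a downstream classifier for social media text analysis.
print(outputs.last_hidden_state.shape)
```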